What is Program-Based Design?
Program-based (sometimes referred to as ‘programmatic’, ‘program-level’ or ‘program-centric’) curriculum and assessment design has two key features. First, it starts by considering an ‘ideal’ sequence or pathway that a student in a particular program should follow in order to achieve the stated learning outcomes. The program director and unit convenors collectively think about what this ‘ideal pathway’ should look like and what students will know and be able to do at different stages of the program.
The second key feature of the programmatic approach is its collaborative nature. It brings together all the convenors on the program and asks them to map the content, activities and assessments of all units in order to align them, check that they are pitched at the right level, and ensure students have sufficient opportunities to develop different skills. Considering how individual units fit into the overall program gives convenors a clear sense of how students experience the whole program sequentially, and whether the activities and assessments in their respective units support a gradual progression through the program. In other words, rather than designing individual units and then assembling them into a program (bringing a dish to a dinner party without knowing what the other guests will bring), convenors design the program collaboratively (they cook together). At Macquarie we have done this successfully with our ‘Design, Develop, Implement’ methodology, which brought complete program teams together to design or redesign their programs.
Why adopt a programmatic approach?
Most programs that were not designed programmatically have gaps in what is taught and assessed. Some capabilities and skills tend to be over-assessed (e.g. the ability to write essays) while others may not be assessed at all. This is no coincidence: it is nearly impossible to identify specific gaps in a program without taking a ‘big picture’, programmatic perspective.
Can you tell us about some practical examples of programmatic curriculum and assessment design at Macquarie?
One good example of a programmatic approach is the Bachelor of Clinical Science, a new accelerated program in the Faculty of Medicine and Health Sciences. Because it was a brand-new program, we had the opportunity to design it using programmatic principles from the outset. The initial development of the program was based on the Design, Develop, Implement methodology.
How was it done?
The Program Director, unit convenors and learning and teaching staff came together in a series of collaborative workshops. Convenors were asked to think about the ‘big picture’ and agree on what an ‘ideal’ graduate of their program looks like and what skills and knowledge that graduate should have.
The team identified three key areas: (1) scholar and scientist; (2) engaged citizen; and (3) professional. They also produced a list of specific competencies, knowledge and skills that an ‘ideal’ graduate would possess (e.g. behaving with integrity and compassion, being a reflective learner, an effective communicator, socially and culturally capable, and a collaborator and team player). They then described the student journey in considerable detail: for example, what students would know and be able to do after six months, 12 months, 18 months, and so on. This shared understanding of the ‘final destination’ and the ‘journey’ laid a firm foundation for the program team.
We then ran several hands-on workshops in which convenors worked on the initial design of their units and shared their thinking with colleagues. This helped ensure good alignment of unit content and assessments across the program. We also introduced an ‘umbrella’ assessment that spans the whole program: an ongoing unit that focuses specifically on developing the agreed ‘ideal graduate’ skills and competencies. In this unit students are invited to reflect on how the assessments in their various units contribute to their overall development as future graduates. This exercise reminds students that they are developing over a two-year program, not just a six-month unit, and that the whole is greater than the sum of its parts.
We chose e-Portfolios as the best way for our students to keep track of their goals and progress. These portfolios are reviewed at key points throughout the session to provide formative feedback, and assessed at the end of each session by the program team, including an expert in programmatic assessment, for both summative and formative feedback. We call these ‘Progression Assessment Points’, and they give students feedback on their overall development. Next year, we plan to involve second-year students as mentors for first-year students.
As this was a large undertaking, we set up a Steering Committee that included members of the teaching team, program directors, other Heads of Departments, educational design staff and representatives from student services. The committee initially met weekly, later moving to fortnightly and then monthly meetings. In hindsight, having this committee was a brilliant idea, as it kept all key stakeholders on the same page and helped build a community of practice.
Your example was about a brand-new program. What would you recommend for existing programs that may want a less ‘involved’ way to do programmatic design?
There are many different ways to do programmatic design. One ‘less involved’ option is simply to align assessments through ‘visualising’ workshops. This can be done in two or three meetings where convenors use a visualisation method, such as multi-coloured sticky notes, to talk through their existing assessments. Together they can review whether there is a good balance of assessment and whether assessments are pitched at the appropriate level. Another useful technique is for each convenor to prepare one slide giving an overview of their unit and its assessments and to talk about it for one to two minutes; this can be done in Faculty Learning and Teaching Committees to gather peer feedback. There are many other ways to visualise assessments or other unit elements and communicate them to colleagues. The key is having the conversation and taking the time to see each unit from the program perspective.
Can we sum up some tips for programmatic curriculum and assessment design?
- Clearly define the Program Director's role and empower them to take an active lead.
- Engage convenors early and help them see their program as a whole, rather than as a collection of smaller parts.
- Have specific points to discuss: bringing people together is a waste of time unless there are concrete questions on the table.
- Encourage convenors to define an ‘ideal learning pathway’ in as much detail as possible; the more the better.
- Allow enough time for feedback and for discussing the ‘program framework’. In our case we allowed two and a half months.
- Use visualisation techniques, such as a big piece of paper and post-it notes in different colours; they help convenors see whether there is an imbalance in assessments or content.
- Have a clear plan for how program-level outcomes will be communicated to students, and for how students will be reminded that they are developing across the whole program, not just within individual units.